Irwin and Joan Jacobs Center for Communication and Information Technologies

On Improved Bounds for Probability Metrics and f-Divergences

Author

  • Igal Sason
Abstract

Derivation of tight bounds for probability metrics and f-divergences is of interest in information theory and statistics. This paper provides elementary proofs that lead, in some cases, to significant improvements over existing bounds; they also lead to the derivation of some existing bounds in a simplified way. The inequalities derived in this paper relate the Bhattacharyya parameter, capacitory discrimination, chi-squared divergence, Chernoff information, Hellinger distance, relative entropy, and the total variation distance. The presentation is intended to be self-contained.

Index Terms – Bhattacharyya parameter, capacitory discrimination, Chernoff information, chi-squared divergence, f-divergence, Hellinger distance, relative entropy, total variation distance.
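For reference, the quantities named in the abstract admit the following standard definitions for probability distributions P and Q on a common discrete alphabet (a sketch of the usual conventions; log bases and normalization factors, e.g. the 1/2 in the Hellinger distance, vary across the literature, so the paper's own definitions are authoritative):

    d_{\mathrm{TV}}(P, Q) = \frac{1}{2} \sum_x |P(x) - Q(x)|                    % total variation distance
    D(P \| Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)}                              % relative entropy
    \chi^2(P \| Q) = \sum_x \frac{(P(x) - Q(x))^2}{Q(x)}                        % chi-squared divergence
    H^2(P, Q) = \frac{1}{2} \sum_x \left( \sqrt{P(x)} - \sqrt{Q(x)} \right)^2   % squared Hellinger distance
    Z(P, Q) = \sum_x \sqrt{P(x) Q(x)}                                           % Bhattacharyya parameter
    C(P, Q) = -\min_{\lambda \in [0,1]} \log \sum_x P(x)^{\lambda} Q(x)^{1-\lambda}   % Chernoff information
    \overline{C}(P, Q) = D\left( P \,\middle\|\, \frac{P+Q}{2} \right) + D\left( Q \,\middle\|\, \frac{P+Q}{2} \right)   % capacitory discrimination

Several of these are f-divergences, i.e. D_f(P \| Q) = \sum_x Q(x) f\left( P(x)/Q(x) \right) for a convex f with f(1) = 0; for instance, the squared Hellinger distance corresponds to f(t) = \frac{1}{2} (\sqrt{t} - 1)^2.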


Related Articles

Bounds on f-Divergences and Related Distances

Derivation of tight bounds on f-divergences and related distances is of interest in information theory and statistics. This paper improves some existing bounds on f-divergences. In some cases, an alternative approach leads to a simplified proof of an existing bound. Following bounds on the chi-squared divergence, an improved version of a reverse Pinsker inequality is derived for an arbitrary …
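One standard inequality on this route, stated here for context rather than as a result of the paper, follows from Jensen's inequality applied to the concave logarithm and bounds the relative entropy by the chi-squared divergence:

    D(P \| Q) \le \log \left( 1 + \chi^2(P \| Q) \right).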


Improved Lower Bounds on the Total Variation Distance for the Poisson Approximation

New lower bounds on the total variation distance between the distribution of a sum of independent Bernoulli random variables and the Poisson random variable (with the same mean) are derived via the Chen-Stein method. The new bounds rely on a non-trivial modification of the analysis by Barbour and Hall (1984) which, surprisingly, gives a significant improvement. A use of the new lower bounds is addressed …
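For orientation, the Barbour and Hall (1984) result that these bounds sharpen can be stated as follows (a standard formulation quoted from the literature, not from the paper itself): if X_1, \ldots, X_n are independent with X_i \sim \mathrm{Bernoulli}(p_i), W = \sum_i X_i, and \lambda = \sum_i p_i, then

    \frac{1}{32} \min\left( 1, \frac{1}{\lambda} \right) \sum_i p_i^2 \;\le\; d_{\mathrm{TV}}\left( P_W, \mathrm{Po}(\lambda) \right) \;\le\; \frac{1 - e^{-\lambda}}{\lambda} \sum_i p_i^2.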


Tightened Exponential Bounds for Discrete-Time, Conditionally Symmetric Martingales with Bounded Jumps

This letter derives some new exponential bounds for discrete-time, real-valued, conditionally symmetric martingales with bounded jumps. The new bounds are extended to conditionally symmetric sub/supermartingales, and are compared to some existing bounds.
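For context, the classical benchmark that such refinements improve upon is the Azuma-Hoeffding inequality (stated here under standard assumptions, not as the letter's result): if (X_k, \mathcal{F}_k) is a martingale with |X_k - X_{k-1}| \le d for all k, then for every \alpha \ge 0,

    \Pr\left( |X_n - X_0| \ge \alpha \right) \le 2 \exp\left( -\frac{\alpha^2}{2 n d^2} \right).

Conditional symmetry of the increments is the extra structure that admits a tighter exponent.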


Entropy Bounds for Discrete Random Variables via Coupling

This paper derives new entropy bounds for discrete random variables via maximal coupling. It provides bounds on the difference between the entropies of two discrete random variables in terms of the local and total variation distances between their probability mass functions. These bounds address cases of finite or countably infinite alphabets. Particular cases of these bounds reproduce some known …
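A classical finite-alphabet bound of this flavor, due to Csiszár and Körner and quoted here for context (the coupling-based bounds above also handle countably infinite alphabets), reads: if \| P - Q \|_1 \le \theta \le \frac{1}{2} for distributions on a finite alphabet \mathcal{X}, then

    |H(P) - H(Q)| \le \theta \log \frac{|\mathcal{X}|}{\theta}.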


On Reverse Pinsker Inequalities

New upper bounds on the relative entropy are derived as a function of the total variation distance. One bound refines an inequality by Verdú for general probability measures. A second bound improves the tightness of an inequality by Csiszár and Talata for arbitrary probability measures that are defined on a common finite set. The latter result is further extended, for probability measures on a ...
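For context, the forward direction being reversed is Pinsker's inequality,

    D(P \| Q) \ge 2 \left[ d_{\mathrm{TV}}(P, Q) \right]^2   (with D in nats),

whereas a reverse Pinsker inequality upper-bounds D(P \| Q) by a function of the total variation distance; this is possible only under additional conditions, such as a positive minimum mass of Q on a finite set.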



Journal title: -

Volume: -  Issue: -

Pages: -

Publication date: 2014